Solving Risk-Sensitive POMDPs With and Without Cost Observations

Hou, Ping (New Mexico State University) | Yeoh, William (New Mexico State University) | Varakantham, Pradeep (Singapore Management University)

AAAI Conferences

Partially Observable Markov Decision Processes (POMDPs) are often used to model planning problems under uncertainty. The goal in Risk-Sensitive POMDPs (RS-POMDPs) is to find a policy that maximizes the probability that the cumulative cost is within some user-defined cost threshold. In this paper, unlike existing POMDP literature, we distinguish between the two cases of whether costs can or cannot be observed and show the empirical impact of cost observations. We also introduce a new search-based algorithm to solve RS-POMDPs and show that it is faster and more scalable than existing approaches in two synthetic domains and a taxi domain generated with real-world data.


Probabilistic Planning with Risk-Sensitive Criterion

Hou, Ping (New Mexico State University)

AAAI Conferences

While probabilistic planning models have been extensively used by the AI and Decision Theory communities for planning under uncertainty, the objective of minimizing the expected cumulative cost is inappropriate for high-stakes planning problems. With this motivation in mind, we revisit the Risk-Sensitive criterion (RS-criterion), where the objective is to find a policy that maximizes the probability that the cumulative cost is within some user-defined cost threshold. The overall scope of this research is to develop efficient and scalable algorithms to optimize the RS-criterion in probabilistic planning problems. In our recent paper (Hou, Yeoh, and Varakantham 2014), we formally defined Risk-Sensitive MDPs (RS-MDPs) and introduced new algorithms for RS-MDPs with non-negative costs. Next, I plan to develop algorithms for RS-MDPs with negative-cost cycles and for Risk-Sensitive POMDPs (RS-POMDPs).
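To make the RS-criterion concrete, here is a minimal sketch of one standard way to evaluate it for an RS-MDP with strictly positive integer costs: augment each state with the remaining cost budget and compute, by dynamic programming, the maximum probability of reaching a goal while the cumulative cost stays within the threshold. The function name, the budget-augmented recursion, and the tiny example MDP are illustrative assumptions, not the algorithms from the papers above.

```python
from functools import lru_cache

def rs_mdp_value(start, actions, trans, cost, goal, threshold):
    """Max probability of reaching `goal` from `start` with cumulative
    cost <= threshold.
    trans[(s, a)] -> list of (next_state, prob); cost[(s, a)] -> int >= 1
    (positive costs guarantee the budget shrinks, so recursion terminates)."""

    @lru_cache(maxsize=None)
    def V(s, budget):
        if s == goal:          # goal reached within budget
            return 1.0
        if budget <= 0:        # budget exhausted before the goal
            return 0.0
        best = 0.0
        for a in actions:
            if (s, a) not in trans:
                continue
            c = cost[(s, a)]
            if c > budget:     # action alone would exceed the threshold
                continue
            p = sum(pr * V(s2, budget - c) for s2, pr in trans[(s, a)])
            best = max(best, p)
        return best

    return V(start, threshold)

# Toy example: from s0, "safe" costs 2 and surely reaches the goal s1;
# "risky" costs 1 but succeeds only with probability 0.5.
actions = ["safe", "risky"]
trans = {("s0", "safe"): [("s1", 1.0)],
         ("s0", "risky"): [("s1", 0.5), ("s0", 0.5)]}
cost = {("s0", "safe"): 2, ("s0", "risky"): 1}

print(rs_mdp_value("s0", actions, trans, cost, "s1", 2))  # 1.0 (safe fits)
print(rs_mdp_value("s0", actions, trans, cost, "s1", 1))  # 0.5 (only risky fits)
```

Note how the optimal choice depends on the threshold, not on expected cost: with budget 2 the deterministic "safe" action is best, while with budget 1 the agent is forced to gamble on "risky". This threshold-dependence is exactly what distinguishes the RS-criterion from expected-cost minimization.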